Maximum likelihood discriminant feature spaces
Authors
Abstract
Linear discriminant analysis (LDA) is known to be inappropriate for the case of classes with unequal sample covariances. In recent years, there has been an interest in generalizing LDA to heteroscedastic discriminant analysis (HDA) by removing the equal within-class covariance constraint. This paper presents a new approach to HDA by defining an objective function which maximizes the class discrimination in the projected subspace while ignoring the rejected dimensions. Moreover, we will investigate the link between discrimination and the likelihood of the projected samples and show that HDA can be viewed as a constrained ML projection for a full covariance gaussian model, the constraint being given by the maximization of the projected between-class scatter volume. It will be shown that, under diagonal covariance gaussian modeling constraints, applying a diagonalizing linear transformation (MLLT) to the HDA space results in increased classification accuracy even though HDA alone actually degrades the recognition performance. Experiments performed on the Switchboard and Voicemail databases show a 10%-13% relative improvement in the word error rate over standard cepstral processing.
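To make the contrast concrete, the following is a minimal numpy sketch of the two objectives the abstract contrasts: standard LDA, which pools a single within-class scatter, and a heteroscedastic log-objective in which each class keeps its own within-class scatter. The function names `lda_projection` and `hda_log_objective` are illustrative; the paper itself optimizes the HDA objective directly (e.g. by gradient methods), which this sketch does not attempt.

```python
import numpy as np

def lda_projection(X, y, p):
    """Standard LDA: maximize |P Sb P^T| / |P Sw P^T| under the
    equal within-class covariance assumption, via the generalized
    eigenproblem Sw^{-1} Sb v = lambda v."""
    classes = np.unique(y)
    mu = X.mean(axis=0)
    d = X.shape[1]
    Sw = np.zeros((d, d))  # pooled within-class scatter
    Sb = np.zeros((d, d))  # between-class scatter
    for c in classes:
        Xc = X[y == c]
        mc = Xc.mean(axis=0)
        Sw += (Xc - mc).T @ (Xc - mc)
        Sb += len(Xc) * np.outer(mc - mu, mc - mu)
    evals, evecs = np.linalg.eig(np.linalg.solve(Sw, Sb))
    order = np.argsort(evals.real)[::-1]
    return evecs.real[:, order[:p]].T  # projection matrix, shape (p, d)

def hda_log_objective(P, X, y):
    """Heteroscedastic objective sketch: sum over classes of
    N_c * [log|P (Sb/N) P^T| - log|P W_c P^T|], where each class c
    keeps its own within-class scatter W_c instead of the pooled Sw."""
    classes = np.unique(y)
    mu = X.mean(axis=0)
    d = X.shape[1]
    Sb = np.zeros((d, d))
    logp = 0.0
    for c in classes:
        Xc = X[y == c]
        mc = Xc.mean(axis=0)
        Wc = (Xc - mc).T @ (Xc - mc) / len(Xc)
        Sb += len(Xc) * np.outer(mc - mu, mc - mu)
        logp -= len(Xc) * np.log(np.linalg.det(P @ Wc @ P.T))
    logp += len(X) * np.log(np.linalg.det(P @ (Sb / len(X)) @ P.T))
    return logp
```

When all classes share one covariance, the heteroscedastic objective reduces to (a monotone function of) the LDA criterion; the gap between the two is exactly what the paper's HDA formulation exploits.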
Similar resources
Review on Heteroscedastic Discriminant Analysis
Discriminant feature spaces are an attractive way to improve the word error rate performance of speech recognition systems. Heteroscedastic discriminant analysis (HDA) is a generalized method for feature space transformation that does not impose the equal within-class covariance assumption required by standard linear discriminant analysis (LDA). It will be shown that the co...
Maximum Echo-State-Likelihood Networks for Emotion Recognition
Maximum Echo-State-Likelihood Networks for Emotion Recognition Edmondo Trentin, Stefan Scherer, and Friedhelm Schwenker; Evaluation of Feature Selection by Multiclass Kernel Discriminant Analysis Tsuneyoshi Ishii and Shigeo Abe; Correlation-Based and Causal Feature Selection Analysis for Ensemble Classifiers Rakkrit Duangsoithong and Terry Windeatt; A New Monte Carlo-based Error Rate Estimator Ah...
Constrained Maximum Likelihood Modeling with Gaussian Distributions
Maximum Likelihood (ML) modeling of multiclass data using gaussian distributions for classification often suffers from the following problems: a) data insufficiency, implying overtrained or unreliable models; b) large storage requirements; c) large computational requirements; and/or d) ML is not discriminating between classes. Sharing parameters across classes (or constraining the parameters) clearly...
Effect Of Radiance-To-Reflectance Transformation And Atmosphere Removal On Maximum Likelihood Classification Accuracy Of High-Dimensional Remote Sensing Data
Many analysis algorithms for high-dimensional remote sensing data require that the remotely sensed radiance spectra be transformed to approximate reflectance to allow comparison with a library of laboratory reflectance spectra. In maximum likelihood classification, however, the remotely sensed spectra are compared to training samples, thus a transformation to reflectance may or may not be helpf...
Supervised Learning of Acoustic Models in a Zero Resource Setting to Improve DPGMM Clustering
In this work we utilize a supervised acoustic model training pipeline without supervision to improve Dirichlet process Gaussian mixture model (DPGMM) based feature vector clustering. We exploit methods common in supervised acoustic modeling to unsupervisedly learn feature transformations for application to the input data prior to clustering. The idea is to automatically find mappings of feature...
Journal title:
Volume / issue:
Pages: -
Publication date: 2000